Canard explosion in delayed equations with multiple timescales
We analyze canard explosions in delayed differential equations with a
one-dimensional slow manifold. This study is applied to explore the dynamics of
the van der Pol slow-fast system with delayed self-coupling. In the absence of
delays, this system provides a canonical example of a canard explosion. We show
that as the delay is increased, a family of 'classical' canard explosions ends
as a Bogdanov-Takens bifurcation occurs at the fold points of the S-shaped
critical manifold.
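For concreteness, a minimal slow-fast van der Pol system with delayed self-coupling consistent with this setting reads as follows; the exact form of the coupling term and the parameter names are our assumptions for illustration, not necessarily those of the paper.

```latex
\varepsilon \dot{x}(t) = y(t) + x(t) - \frac{x(t)^{3}}{3} + c\,x(t-\tau),
\qquad
\dot{y}(t) = a - x(t)
```

Here $0 < \varepsilon \ll 1$ enforces the timescale separation, $a$ is the slow parameter unfolding the canard explosion, and $c$ and $\tau$ are the strength and delay of the self-coupling; setting $\tau = 0$ recovers a one-dimensional slow manifold on the S-shaped cubic nullcline, while increasing $\tau$ is what can reorganize the dynamics near the fold points.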
On the dynamics of random neuronal networks
We study the mean-field limit and stationary distributions of a pulse-coupled
network modeling the dynamics of large neuronal assemblies. Our model takes
into account explicitly the intrinsic randomness of firing times, contrasting
with the classical integrate-and-fire model. The ergodicity properties of the
Markov process associated to finite networks are investigated. We derive the
limit in distribution of the sample path of the state of a neuron of the
network when its size gets large. The invariant distributions of this limiting
stochastic process are analyzed as well as their stability properties. We show
that the system undergoes transitions as a function of the averaged
connectivity parameter, and can support trivial states (where the network
activity dies out, which is also the unique stationary state of finite networks
in some cases) and self-sustained activity when the connectivity level is
sufficiently large, both states being possibly stable.
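A toy numerical sketch of this transition is given below; the specific choices (leak rate, firing intensity beta(x) = x², reset to zero, and a uniform J/N pulse coupling) are illustrative assumptions rather than the paper's exact model.

```python
# Pulse-coupled network with *random* firing times: each neuron fires in
# [t, t+dt) with probability beta(x_i) * dt, is reset to 0, and sends a
# pulse J/N to every neuron. Toy model, not the paper's exact equations.
import numpy as np

rng = np.random.default_rng(0)

def simulate(J, N=1000, T=20.0, dt=1e-3, lam=1.0):
    """Return the time series of the population firing rate."""
    x = rng.uniform(0.0, 1.0, N)              # membrane states
    rates = []
    for _ in range(int(T / dt)):
        beta = np.clip(x, 0.0, None) ** 2     # state-dependent intensity
        fired = rng.random(N) < beta * dt     # stochastic spike times
        k = fired.sum()
        x += dt * (-lam * x)                  # leak between spikes
        x += J * k / N                        # pulse coupling
        x[fired] = 0.0                        # reset after a spike
        rates.append(k / (N * dt))
    return np.array(rates)

for J in (0.5, 3.0):
    r = simulate(J)
    print(f"J={J}: mean rate over last half = {r[len(r)//2:].mean():.3f}")
```

With these (assumed) parameters, weak coupling typically lets the activity die out toward the trivial state, while strong coupling can sustain it, echoing the connectivity-driven transition described in the abstract.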
Noise-induced synchronization and anti-resonance in excitable systems: implications for information processing in Parkinson's disease and Deep Brain Stimulation
We study the statistical physics of a surprising phenomenon arising in large
networks of excitable elements in response to noise: while at low noise,
solutions remain in the vicinity of the resting state and large-noise solutions
show asynchronous activity, the network displays orderly, perfectly
synchronized periodic responses at intermediate levels of noise. We show that
this phenomenon is fundamentally stochastic and collective in nature. Indeed,
for noise and coupling within specific ranges, an asymmetry in the transition
rates between the resting and the excited regimes progressively builds up,
leading to an increase in the fraction of excited neurons that eventually
triggers a chain reaction: a macroscopic synchronized excursion followed by a
collective return to rest, where the process starts afresh, yielding the
observed periodic synchronized oscillations. We further uncover a novel
anti-resonance
phenomenon: noise-induced synchronized oscillations disappear when the system
is driven by periodic stimulation with frequency within a specific range. In
that anti-resonance regime, the system performs optimally with respect to
measures of information
capacity. This observation provides a new hypothesis accounting for the
efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a
neurodegenerative disease characterized by an increased synchronization of
brain motor circuits. We further discuss the universality of these phenomena in
the class of stochastic networks of excitable elements with confining coupling,
and illustrate this universality by analyzing various classical models of
neuronal networks. Altogether, these results uncover some universal mechanisms
supporting a regularizing impact of noise in excitable systems, reveal a novel
anti-resonance phenomenon in these systems, and propose a new hypothesis for
the efficiency of high-frequency stimulation in Parkinson's disease
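The mechanism can be illustrated on a standard excitable model; the sketch below uses N FitzHugh-Nagumo units with attractive mean-field coupling and additive noise, all parameter values being illustrative assumptions.

```python
# Noise-induced collective oscillations in a network of excitable units:
# FitzHugh-Nagumo dynamics, attractive (confining) mean-field coupling,
# additive noise, Euler-Maruyama integration. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(1)

def mean_field_fluctuation(sigma, N=500, T=200.0, dt=0.01,
                           J=0.5, a=1.3, eps=0.08):
    v = rng.normal(-a, 0.1, N)                # start near the rest state
    w = v - v**3 / 3.0
    vbars = []
    for i in range(int(T / dt)):
        vbar = v.mean()
        dv = v - v**3 / 3.0 - w + J * (vbar - v)   # excitable fast variable
        dw = eps * (v + a)                          # slow recovery variable
        v += dt * dv + sigma * np.sqrt(dt) * rng.normal(size=N)
        w += dt * dw
        if i * dt > T / 2:
            vbars.append(vbar)                # record the mean field
    return float(np.std(vbars))               # large value = collective spikes

for sigma in (0.05, 0.3, 1.5):
    print(f"sigma={sigma}: mean-field fluctuation = "
          f"{mean_field_fluctuation(sigma):.3f}")
```

The expected pattern is small mean-field fluctuations at low noise (the population sits at rest), large fluctuations at intermediate noise (synchronized macroscopic excursions), and small fluctuations again at high noise, where asynchronous activity averages out.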
A constructive mean field analysis of multi population neural networks with random synaptic weights and stochastic inputs
We deal with the problem of bridging the gap between two scales in neuronal
modeling. At the first (microscopic) scale, neurons are considered individually
and their behavior is described by stochastic differential equations that govern
the time variations of their membrane potentials. They are coupled by synaptic
connections acting on their resulting activity, a nonlinear function of their
membrane potential. At the second (mesoscopic) scale, interacting populations
of neurons are described individually by similar equations. The equations
describing the dynamical and the stationary mean field behaviors are considered
as functional equations on a set of stochastic processes. Using this new point
of view allows us to prove that these equations are well-posed on any finite
time interval and to provide a constructive method for effectively computing
their unique solution. This method is proved to converge to the unique solution
and we characterize its complexity and convergence rate. We also provide
partial results for the stationary problem on infinite time intervals. These
results shed new light on neural mass models such as that of Jansen and Rit
(1995): their dynamics appears as a coarse approximation of
the much richer dynamics that emerges from our analysis. Our numerical
experiments confirm that the framework we propose and the numerical methods we
derive from it provide a new and powerful tool for the exploration of neural
behaviors at different scales.
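A minimal sketch of the constructive approach, under strong simplifying assumptions (one population, linear intrinsic dynamics, a sigmoidal interaction, and the mean trajectory as the only interaction statistic), is the following Picard-type iteration on the space of trajectories; all names and parameter values are illustrative.

```python
# Constructive (Picard-type) iteration for a toy mean-field equation:
# given a candidate mean trajectory m, simulate dV = (-V + J*s(m_t))dt
# + sigma dW and return the empirical mean of V; iterate to a fixed point.
import numpy as np

rng = np.random.default_rng(2)
T, dt = 2.0, 1e-3
steps = int(T / dt)
J, sigma = 2.0, 0.5
s = lambda v: 1.0 / (1.0 + np.exp(-v))        # interaction nonlinearity

def picard_step(m, n_paths=5000):
    v = np.zeros(n_paths)
    out = np.empty(steps)
    for i in range(steps):
        v += dt * (-v + J * s(m[i])) \
             + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
        out[i] = v.mean()
    return out

m = np.zeros(steps)                            # initial guess
for k in range(6):
    m_new = picard_step(m)
    print(f"iteration {k}: sup-norm update = {np.abs(m_new - m).max():.4f}")
    m = m_new
```

The sup-norm updates contract geometrically until they hit the Monte-Carlo noise floor, which is the finite-horizon convergence the abstract refers to, here in its crudest simulated form.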
Finite-size and correlation-induced effects in Mean-field Dynamics
The brain's activity is characterized by the interaction of a very large
number of neurons that are strongly affected by noise. However, signals often
arise at macroscopic scales integrating the effect of many neurons into a
reliable pattern of activity. In order to study such large neuronal assemblies,
one is often led to derive mean-field limits summarizing the effect of the
interaction of a large number of neurons into an effective signal. Classical
mean-field approaches consider the evolution of a deterministic variable, the
mean activity, thus neglecting the stochastic nature of neural behavior. In
this article, we build upon two recent approaches that include correlations and
higher order moments in mean-field equations, and study how these stochastic
effects influence the solutions of the mean-field equations, both in the limit
of an infinite number of neurons and for large yet finite networks. We
introduce a new model, the infinite model, which arises from both sets of
equations through a rescaling of the variables; this rescaling is invertible
for finite-size networks and hence yields equations equivalent to the
previously derived models. The study of this model allows us to understand the
qualitative behavior of such
large-scale networks. We show that, though the solutions of the deterministic
mean-field equation constitute uncorrelated solutions of the new mean-field
equations, the stability properties of limit cycles are modified by the
presence of correlations, and additional non-trivial behaviors including
periodic orbits appear when there were none in the mean field. The origin of
all these behaviors is then explored in finite-size networks where interesting
mesoscopic-scale effects appear. This study leads us to show that the
infinite-size system appears as a singular limit of the network equations and
that, for any finite network, the system will differ from the infinite system.
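One way to see how correlations alter the picture is to compare the classical mean-only equation with a Gaussian second-moment closure of the same toy rate model; the model and parameters below are assumptions for illustration, not the equations derived in the paper.

```python
# Mean-only dynamics  m' = -m + J*s(m)  versus a Gaussian closure where
# the drive is E[s(V)] for V ~ N(m, c) and c' = -2c + sigma^2.
import numpy as np

J, sigma, theta = 6.0, 1.0, 3.0
s = lambda v: 1.0 / (1.0 + np.exp(-(v - theta)))

# Gauss-Hermite quadrature for E[s(V)], V ~ N(m, c)
nodes, weights = np.polynomial.hermite_e.hermegauss(41)

def s_gauss(m, c):
    return weights @ s(m + np.sqrt(max(c, 0.0)) * nodes) / weights.sum()

def run(with_variance, T=30.0, dt=1e-3):
    m, c = 0.0, 0.0
    for _ in range(int(T / dt)):
        drive = s_gauss(m, c) if with_variance else s(m)
        m += dt * (-m + J * drive)
        c += dt * (-2.0 * c + sigma**2)       # variance relaxes to sigma^2/2
    return m

print("mean-only fixed point      :", round(run(False), 3))
print("with Gaussian fluctuations :", round(run(True), 3))
```

The fluctuation term effectively smooths the sigmoid, which can shift, destabilize, or remove fixed points of the mean equation; this is one elementary mechanism behind the correlation-induced changes of stability and the new periodic behaviors discussed above.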
A Markovian event-based framework for stochastic spiking neural networks
In spiking neural networks, information is conveyed by the spike times, which
depend on the intrinsic dynamics of each neuron, the input it receives, and
the connections between neurons. In this article we study the Markovian
nature of the sequence of spike times in stochastic neural networks, and in
particular the ability to deduce the next spike time from a spike train, and
therefore to describe the network activity based on the spike times alone,
independently of the membrane potential process.
To study this question in a rigorous manner, we introduce and study an
event-based description of networks of noisy integrate-and-fire neurons, i.e.,
one based on the computation of the spike times. We show that the firing
times of the neurons in the network constitute a Markov chain, whose
transition probability is related to the probability distribution of the
interspike interval of the neurons in the network. In the cases where the
Markovian model can be developed, the transition probability is derived
explicitly for such classical neural network models as linear
integrate-and-fire neurons with excitatory and inhibitory interactions and
different types of synapses, possibly featuring noisy synaptic integration,
transmission delays, and absolute and relative refractory periods. This covers
most of the cases that have been investigated in the event-based description
of deterministic spiking neural networks.
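The simplest instance where the event-based transition law is available in closed form is a noisy perfect (non-leaky) integrate-and-fire neuron, for which the interspike interval is the first passage of a drifted Brownian motion to threshold, an inverse-Gaussian random variable. The sketch below (our simplification, not the paper's general construction) generates the spike-time chain event by event.

```python
# Event-based simulation of one noisy perfect integrate-and-fire neuron:
# from reset 0 with drift mu, noise sigma, threshold theta, the ISI is
# inverse-Gaussian IG(theta/mu, theta^2/sigma^2). Spike times are built
# as a Markov chain whose transition law is exactly this distribution.
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, theta = 1.0, 0.5, 1.0

def next_isi():
    """Sample IG(m, lam) by the Michael-Schucany-Haas transformation."""
    m, lam = theta / mu, theta**2 / sigma**2
    nu = rng.normal() ** 2
    x = m + m * (m * nu - np.sqrt(4 * m * lam * nu + (m * nu) ** 2)) / (2 * lam)
    return x if rng.random() <= m / (m + x) else m**2 / x

t, spikes = 0.0, []
for _ in range(20000):
    t += next_isi()            # Markov transition: next spike time
    spikes.append(t)

isis = np.diff(spikes)
print("mean ISI:", isis.mean(), " theory:", theta / mu)
print("var  ISI:", isis.var(), " theory:", theta * sigma**2 / mu**3)
```

In a network, interactions shift the potential between events, so the transition probability acquires a dependence on the other neurons' states; that is the structure the abstract derives for the interacting cases.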
Limits and dynamics of stochastic neuronal networks with random heterogeneous delays
Realistic networks display heterogeneous transmission delays. We analyze here
the limits of large stochastic multi-population networks with stochastic
coupling and random interconnection delays. We show that, depending on the
nature of the delay distributions, a quenched or averaged propagation of chaos
takes place in these networks, and that the network equations converge towards
a delayed McKean-Vlasov equation with distributed delays. Our approach is
mostly fitted to neuroscience applications. We instantiate in particular a
classical neuronal model, the Wilson and Cowan system, and show that the
obtained limit equations have Gaussian solutions whose mean and standard
deviation satisfy a closed set of coupled delay differential equations in which
the distribution of delays and the noise levels appear as parameters. This
allows us to uncover precisely the effects of noise, delays, and coupling on the
dynamics of such heterogeneous networks, in particular their role in the
emergence of synchronized oscillations. We show in several examples that not
only the average delay but also its dispersion governs the dynamics of such
networks.
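A concrete reduced system of the kind described, written here under assumed forms (sigmoidal rates, coupling through the expectation of the delayed state, and a discrete delay distribution), can be integrated directly; comparing a sharp delay with a dispersed one of the same mean illustrates the role of dispersion.

```python
# Assumed reduced equations for the Gaussian limit: mean m(t) and
# variance c(t) with
#   m'(t) = -m(t) + J * sum_k p_k * E[ S(N(m(t - tau_k), c(t - tau_k))) ]
#   c'(t) = -2 c(t) + sigma^2
# The delay distribution {(tau_k, p_k)} enters as a parameter.
import numpy as np

J, sigma = -20.0, 0.6                 # delayed inhibition can oscillate
S = lambda v: 1.0 / (1.0 + np.exp(-v))
nodes, weights = np.polynomial.hermite_e.hermegauss(31)

def gauss_S(m, c):
    return weights @ S(m + np.sqrt(max(c, 0.0)) * nodes) / weights.sum()

def amplitude(delays, probs, T=100.0, dt=1e-2):
    n, hist = int(T / dt), int(round(max(delays) / dt)) + 1
    m, c = np.zeros(n + hist), np.zeros(n + hist)
    m[:hist] = 1.0                    # constant initial history
    for i in range(hist, n + hist):
        drive = sum(p * gauss_S(m[i - int(round(d / dt))],
                                c[i - int(round(d / dt))])
                    for d, p in zip(delays, probs))
        m[i] = m[i - 1] + dt * (-m[i - 1] + J * drive)
        c[i] = c[i - 1] + dt * (-2.0 * c[i - 1] + sigma**2)
    tail = m[-(n // 2):]
    return tail.max() - tail.min()    # oscillation amplitude

print("sharp delay (tau = 2)     :", round(amplitude([2.0], [1.0]), 3))
print("dispersed delays (1, 2, 3):",
      round(amplitude([1.0, 2.0, 3.0], [1/3, 1/3, 1/3]), 3))
```

With a sharp delay the delayed negative feedback sustains a synchronized oscillation, while spreading the same mean delay over a wider distribution damps it, which is precisely the sense in which the dispersion, and not only the average delay, governs the dynamics.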